
How to Build Scalable Real-time Applications on a Databricks Lakehouse with Confluent

#artificialintelligence

For many organizations, collecting and processing data in real time at scale can provide immense business and operational insights. But the need for real-time data introduces technical challenges that typically demand specialized expertise and custom integration work. For customers looking to build real-time streaming applications, our partner Confluent recently announced a new Databricks Connector for Confluent Cloud. This fully managed connector is designed specifically for the data lakehouse and provides a powerful way to build and scale real-time applications such as application monitoring, Internet of Things (IoT), fraud detection, personalization, and gaming leaderboards. Organizations can now stream legacy and cloud data from Confluent Cloud directly into the Databricks Lakehouse for business intelligence (BI), data analytics, and machine learning use cases on a single platform.
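The managed connector itself is configured on the Confluent Cloud side, but the underlying pattern can be sketched from the Databricks side with Spark Structured Streaming: read a Kafka topic and land it in a Delta table. This is a minimal sketch, not the connector's implementation; the broker address, credentials, topic, checkpoint path, and table name below are all placeholders rather than values from the announcement.

```python
# Sketch: consume a Confluent Cloud topic from Databricks with Structured
# Streaming and persist it to a Delta table. All endpoints, credentials,
# and names are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("confluent-to-lakehouse").getOrCreate()

raw_events = (
    spark.readStream
    .format("kafka")
    # Placeholder Confluent Cloud bootstrap server
    .option("kafka.bootstrap.servers", "pkc-xxxxx.us-west-2.aws.confluent.cloud:9092")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="<API_KEY>" password="<API_SECRET>";',  # placeholder credentials
    )
    .option("subscribe", "iot-events")  # placeholder topic name
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers keys and values as binary; cast them before persisting.
decoded = raw_events.selectExpr(
    "CAST(key AS STRING) AS key",
    "CAST(value AS STRING) AS value",
    "timestamp",
)

# Continuously write the decoded records into a Delta table, which BI and
# ML workloads on the lakehouse can then query.
(
    decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/iot-events")  # placeholder path
    .toTable("lakehouse.iot_events")  # placeholder table name
)
```

In practice the fully managed connector removes the need to run and tune this consumer yourself; the sketch is only meant to show the shape of the Kafka-to-Delta flow it automates.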


Databricks announces a new portal named Databricks Partner Connect

#artificialintelligence

Databricks, the Data and AI company and pioneer of the data lakehouse architecture, today announced Databricks Partner Connect, a one-stop portal for customers to quickly discover a broad set of validated data, analytics, and AI tools and easily integrate them with their Databricks lakehouse across multiple cloud providers. Integrations with Databricks partners Fivetran, Labelbox, Microsoft Power BI, Prophecy, Rivery, and Tableau are available to customers initially, with Airbyte, Blitzz, dbt Labs, and many more to come in the months ahead. Enterprises want to drive complexity out of their data infrastructure and adopt more open technologies to take better advantage of analytics and AI. The data lakehouse enabled by Databricks has put thousands of customers on this path, collectively processing multiple exabytes of data a day on a single platform for analytics and AI workloads. But the data ecosystem is vast, and no one vendor can accomplish everything.